# BF16 Quantized Inference
DeepSeek-V3-0324 BF16
License: MIT
DeepSeek-V3-0324 BF16 is a BF16 version of the large language model released by DeepSeek AI, suitable for quantization and inference on GPUs that do not support FP8.
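To illustrate the trade-off the description refers to, the sketch below converts float32 values to bfloat16 at the bit level: BF16 keeps the sign bit, the full 8-bit exponent (so it preserves FP32's dynamic range), and only the top 7 mantissa bits (so it gives up precision). This is a standalone illustration of the BF16 format, not code from the model release; real frameworks perform this cast in hardware or via tensor dtype conversion.

```python
import struct

def f32_bits(x: float) -> int:
    """Bit pattern of x as an IEEE-754 float32."""
    return struct.unpack(">I", struct.pack(">f", x))[0]

def f32_to_bf16(x: float) -> int:
    """Convert float32 to a 16-bit bfloat16 pattern (round-to-nearest-even)."""
    bits = f32_bits(x)
    # Add 0x7FFF plus the lowest surviving mantissa bit, then truncate:
    # this implements round-to-nearest, ties-to-even.
    bias = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + bias) >> 16) & 0xFFFF

def bf16_to_f32(b: int) -> float:
    """Widen a bfloat16 pattern back to float32 (exact: mantissa is zero-filled)."""
    return struct.unpack(">f", struct.pack(">I", (b & 0xFFFF) << 16))[0]

# Values whose mantissa fits in 7 bits round-trip exactly; others lose low bits.
print(bf16_to_f32(f32_to_bf16(1.0)))          # 1.0 (exact)
print(bf16_to_f32(f32_to_bf16(1.001953125)))  # 1.0 (halfway tie rounds to even)
print(bf16_to_f32(f32_to_bf16(1.005859375)))  # 1.0078125 (rounds up)
```

Because the exponent field is unchanged, the conversion never overflows or underflows relative to FP32, which is one reason BF16 is a common fallback on GPUs lacking FP8 support.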
Large Language Model
Transformers
ModelCloud
© 2025 AIbase